
Collaborating Authors

 Alberto Rodriguez


NeuralTouch: Neural Descriptors for Precise Sim-to-Real Tactile Robot Control

Lin, Yijiong, Deng, Bowen, Lu, Chenghua, Yang, Max, Psomopoulou, Efi, Lepora, Nathan F.

arXiv.org Artificial Intelligence

Abstract--Grasping accuracy is a critical prerequisite for precise object manipulation, often requiring careful alignment between the robot hand and object. Neural Descriptor Fields (NDF) offer a promising vision-based method to generate grasping poses that generalize across object categories. However, NDF alone can produce inaccurate poses due to imperfect camera calibration, incomplete point clouds, and object variability. Meanwhile, tactile sensing enables more precise contact, but existing approaches typically learn policies limited to simple, predefined contact geometries. In this work, we introduce NeuralTouch, a multi-modal framework that integrates NDF and tactile sensing to enable accurate, generalizable grasping through gentle physical interaction. Our approach leverages NDF to implicitly represent the target contact geometry, from which a deep reinforcement learning (RL) policy is trained to refine the grasp using tactile feedback. This policy is conditioned on the neural descriptors and does not require explicit specification of contact types. Results show that NeuralTouch significantly improves grasping accuracy and robustness over baseline methods, offering a general framework for precise, contact-rich robotic manipulation. I. INTRODUCTION A commonplace behaviour in humans is our ability to glance at an object to determine its general position and then use touch alone to grasp it with precision.
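The core idea of the abstract, a policy conditioned jointly on neural descriptors and tactile feedback that outputs small grasp corrections, can be sketched in a few lines. Everything here is hypothetical: the dimensions, the random weights standing in for a trained network, and the output scale are all assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: descriptor, tactile-feature, and action dimensions
# are assumptions for illustration (action = 6-DoF pose correction).
DESC_DIM, TACTILE_DIM, ACTION_DIM = 32, 16, 6

# A tiny randomly initialised MLP standing in for the trained RL policy.
W1 = rng.normal(0.0, 0.1, (DESC_DIM + TACTILE_DIM, 64))
W2 = rng.normal(0.0, 0.1, (64, ACTION_DIM))

def policy(descriptor, tactile):
    """Map (neural descriptor, tactile feature) -> bounded pose correction.

    The descriptor implicitly encodes the target contact geometry; the
    tanh output keeps each correction small, mimicking gentle refinement.
    """
    x = np.concatenate([descriptor, tactile])
    h = np.tanh(x @ W1)
    return 0.01 * np.tanh(h @ W2)  # 0.01 scale is an arbitrary assumption
```

In the paper's framing, the key property is that the same policy generalizes across contact types because the descriptor, not a hand-specified contact class, carries the geometric information.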


Tactile-Driven Non-Prehensile Object Manipulation via Extrinsic Contact Mode Control

Oller, Miquel, Berenson, Dmitry, Fazeli, Nima

arXiv.org Artificial Intelligence

In this paper, we consider the problem of non-prehensile manipulation using grasped objects. This problem is a superset of many common manipulation skills including instances of tool-use (e.g., grasped spatula flipping a burger) and assembly (e.g., screwdriver tightening a screw). Here, we present an algorithmic approach for non-prehensile manipulation leveraging a gripper with highly compliant and high-resolution tactile sensors. Our approach solves for robot actions that drive object poses and forces to desired values while obeying the complex dynamics induced by the sensors as well as the constraints imposed by static equilibrium, object kinematics, and frictional contact. Our method is able to produce a variety of manipulation skills and is amenable to gradient-based optimization by exploiting differentiability within contact modes (e.g., specifications of sticking or sliding contacts). We evaluate 4 variants of controllers that attempt to realize these plans and demonstrate a number of complex skills including non-prehensile planar sliding and pivoting on a variety of object geometries. The perception and controls capabilities that drive these skills are the building blocks towards dexterous and reactive autonomy in unstructured environments.
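The abstract's point about exploiting differentiability within a contact mode can be illustrated with a toy problem. Within a single "sticking pivot" mode, the object rotates rigidly about a fixed extrinsic contact point, so the dynamics are smooth and gradient descent on the action sequence works; this sketch is a stand-in under that assumption, not the paper's planner, and all function names and numbers here are invented.

```python
import numpy as np

def rollout(theta0, actions):
    """Forward model inside one 'sticking pivot' contact mode: the object
    rotates rigidly about the extrinsic contact, so the final angle is a
    smooth, differentiable function of the commanded rotation increments."""
    return theta0 + np.sum(actions)

def plan_pivot(theta0, theta_goal, horizon=5, iters=100, lr=0.05):
    """Gradient descent on a sequence of rotation increments driving the
    object to a goal angle -- a toy stand-in for mode-constrained
    trajectory optimization over a quadratic goal cost."""
    actions = np.zeros(horizon)
    for _ in range(iters):
        err = rollout(theta0, actions) - theta_goal
        grad = 2.0 * err * np.ones(horizon)  # d(err^2)/d(action_i) = 2*err
        actions -= lr * grad
    return actions
```

Crossing between modes (e.g., sticking to sliding) is where the dynamics become non-smooth, which is why the paper plans within a specified contact mode.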


Fox News AI Newsletter: Doctor's groundbreaking surgery

FOX News

Rodriguez detailed that the MARS system gives surgeons "two extra arms" for instrument control, as well as camera stability. SURGICAL 'REVOLUTION': Surgeon and CEO Dr. Alberto Rodriguez conducted the first-ever augmented reality (AR) abdominal surgery March 11 in Santiago, Chile. 'SCARY' SCHOOL TREND: Multiple Los Angeles-area school districts have investigated instances of "inappropriate," artificial intelligence-generated images of students circulating online and in text messages in recent months. AI IN PDF: Adobe announced that its new Acrobat artificial intelligence assistant will be available to Acrobat and Reader users starting on Tuesday. POTHOLE HEALER: Tech firm Robotiz3d is developing three technologies as part of its Autonomous Road Repair System.


Tactile-Filter: Interactive Tactile Perception for Part Mating

Ota, Kei, Jha, Devesh K., Tung, Hsiao-Yu, Tenenbaum, Joshua B.

arXiv.org Artificial Intelligence

Humans rely on touch and tactile sensing for many dexterous manipulation tasks. Tactile sensing provides rich information about contact formations as well as geometric information about objects during any interaction. With this motivation, vision-based tactile sensors are being widely used for various robotic perception and control tasks. In this paper, we present a method for interactive perception using vision-based tactile sensors for a part-mating task, where a robot uses tactile sensors and a particle-filter feedback mechanism to incrementally improve its estimate of which objects (pegs and holes) fit together. To do this, we first train a deep neural network that uses tactile images to predict the probabilistic correspondence between arbitrarily shaped objects that fit together. The trained model is used to design a particle filter that serves two purposes. First, given one partial (or non-unique) observation of the hole, it incrementally improves the estimate of the correct peg by sampling more tactile observations. Second, it selects the robot's next touch (and thus the next tactile image) so as to maximize uncertainty reduction, minimizing the number of interactions during the perception task. We evaluate our method on several part-mating tasks with novel objects using a robot equipped with a vision-based tactile sensor, and show the efficiency of the proposed action-selection method against a naive method. See supplementary video at https://www.youtube.com/watch?v=jMVBg_e3gLw .
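The two roles of the particle filter described above, Bayesian reweighting of peg hypotheses from new tactile observations and choosing the next touch by expected uncertainty reduction, can be sketched as follows. This is a minimal illustration under assumed data structures (discrete peg hypotheses, per-action outcome likelihoods), not the paper's implementation.

```python
import numpy as np

def particle_filter_update(weights, likelihoods):
    """One Bayesian update: reweight peg hypotheses by the network's
    predicted correspondence likelihood for a new tactile observation,
    then renormalize."""
    w = weights * likelihoods
    return w / w.sum()

def entropy(weights):
    """Shannon entropy of the belief over peg hypotheses."""
    w = weights[weights > 0]
    return -np.sum(w * np.log(w))

def select_next_touch(weights, likelihoods_per_action):
    """Pick the touch whose expected posterior entropy is lowest, i.e.
    the action that most reduces uncertainty about which peg fits.

    likelihoods_per_action[a] is a list of likelihood vectors, one per
    possible tactile outcome of action a (an assumed discretization).
    """
    best_action, best_h = None, np.inf
    for a, outcome_liks in enumerate(likelihoods_per_action):
        h = 0.0
        for obs_lik in outcome_liks:
            p_obs = float(np.dot(weights, obs_lik))  # marginal outcome prob
            if p_obs > 0:
                h += p_obs * entropy(particle_filter_update(weights, obs_lik))
        if h < best_h:
            best_action, best_h = a, h
    return best_action
```

For example, with a uniform belief over two pegs, an action whose predicted likelihoods discriminate between them (say 0.9 vs. 0.1) yields lower expected posterior entropy than an uninformative one (0.5 vs. 0.5), so it is selected.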


Q&A: Alberto Rodriguez on teaching a robot to find your keys

#artificialintelligence

Growing up in Spain's Catalonia region, Alberto Rodriguez loved taking things apart and putting them back together. But it wasn't until he joined a robotics lab in his final year of college that he realized robotics, and not mathematics or physics, would be his life's calling. "I fell in love with the idea that you could build something and then tell it what to do," he says. "That was my first intense exposure to the magic combo of building and coding, and I was hooked." After graduating from university in Barcelona, Rodriguez looked for a path to study in the United States.


Robot That Senses Hidden Objects – "We're Trying to Give Robots Superhuman Perception"

#artificialintelligence

MIT researchers developed a picking robot that combines vision with radio frequency (RF) sensing to find and grasp objects, even when they're hidden from view. The technology could aid fulfilment in e-commerce warehouses. The system uses penetrative radio frequency to pinpoint items behind occlusions. In recent years, robots have gained artificial vision, touch, and even smell. "Researchers have been giving robots human-like perception," says MIT Associate Professor Fadel Adib.